
    Augmenting Compositional Models for Knowledge Base Completion Using Gradient Representations

    Neural models of Knowledge Base data have typically employed compositional representations of graph objects: entity and relation embeddings are systematically combined to evaluate the truth of a candidate Knowledge Base entry. Using a model inspired by Harmonic Grammar, we propose to tokenize triplet embeddings by subjecting them to a process of optimization with respect to learned well-formedness conditions on Knowledge Base triplets. The resulting model, known as Gradient Graphs, leads to sizable improvements when implemented as a companion to compositional models. Also, we show that the supracompositional triplet token embeddings it produces have interpretable properties that prove helpful in performing inference on the resulting triplet representations.
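
    As a rough sketch of the Gradient Graphs idea (under assumptions of our own, not the paper's exact formulation), the Python snippet below refines a compositional triplet embedding by gradient ascent on a learned quadratic Harmony function standing in for the well-formedness conditions; the dimensionality, the elementwise composition, and the parameters W and b are illustrative placeholders.

import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimensionality (illustrative)

# Placeholder "learned" parameters: a quadratic Harmony function
# H(v) = v^T W v + b^T v plays the role of the learned well-formedness
# conditions on triplet embeddings; in the paper these are trained, not random.
W = rng.normal(scale=0.1, size=(d, d))
W = 0.5 * (W + W.T)              # symmetrize so the gradient has a simple form
b = rng.normal(scale=0.1, size=d)

def harmony(v):
    return v @ W @ v + b @ v

def refine(v, steps=50, lr=0.05):
    """Gradient-ascent refinement of a compositional 'type' embedding into a
    'token' embedding adapted to the learned well-formedness conditions."""
    v = v / np.linalg.norm(v)
    for _ in range(steps):
        grad = 2.0 * W @ v + b   # dH/dv for the quadratic Harmony above
        v = v + lr * grad
        v = v / np.linalg.norm(v)  # keep the embedding on the unit sphere
    return v

# Compositional embedding of a candidate (subject, relation, object) triple,
# here just an elementwise combination of random entity/relation vectors.
e_s, r, e_o = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)
v_type = e_s * r * e_o
v_type = v_type / np.linalg.norm(v_type)
v_token = refine(v_type)
print(harmony(v_type), harmony(v_token))  # compare Harmony before and after refinement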

    STRUCTURE ASSEMBLY IN KNOWLEDGE BASE REPRESENTATION

    A primary goal of connectionist cognitive science is to provide the technical apparatus for modeling cognitive processes as implemented in brainlike systems. From the structure of classical cognitive theories--alphabets of discrete symbols along with algebraic operations on those primitive symbols--one can derive key properties attributed to "higher-order cognition" by authors like Fodor and Pylyshyn--systematicity, productivity, and compositionality. This fact has led theories of that type to serve as the enduring backbone of cognitive science. Connectionist networks differ from these classical systems not just in that they are implemented in real numbers while the former operate over discrete sets, but also in that they can represent system-states that cannot be factored into algebraic combinations of primitive symbols--states that are, in a formal sense, noncompositional. This dissertation develops, examines, and evaluates a series of models that straddle that divide. Cognitively oriented connectionist models in the tradition of Vector Symbolic Architectures (VSAs) provide a framework by which neural network models can integrate Fodor and Pylyshyn's insights, as well as the corresponding architectural commitments of cognitive theory. Other processes, based in Harmony Maximization, adjust these structured representations to satisfy learned constraints. We present three models that use Tensor Product Representations (TPRs) and other VSAs closely related to them, placing these in a common formal framework and then applying them to Knowledge Base Completion: the task of storing large-scale inventories of facts (e.g. WordNet, Freebase, Wikidata) in representations that allow those databases to be extended via inexact inference. In typical approaches, graph representations are obtained compositionally by taking static vector representations of graph elements (entities and relations) and combining them systematically in order to derive a score. The first two models combine the compositional operations of VSAs with context-modulation processes based in Harmonic Grammar. A third model examines the proposition that the spatial structure implicit in TPRs--with a number of spatial directions equal to the order of the tensor--can be used as an organizing principle for the features encoded in the trilinear tensors occurring in the graph representation setting. Each of the models, we show, performs at the state of the art in Knowledge Base Completion, and we explore the qualitative aspects of the representations that they learn.
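
    To make the compositional scoring setup concrete, the following minimal Python sketch combines toy entity vectors with a per-relation matrix via a Tensor Product Representation to produce a RESCAL-style bilinear score for a candidate triple; the entity and relation names and the random data are assumptions for illustration and do not reproduce the dissertation's specific models.

import numpy as np

rng = np.random.default_rng(1)
d = 8  # embedding dimensionality (illustrative)

# Toy entity and relation parameters; in practice these embeddings are
# learned from the Knowledge Base rather than drawn at random.
entities = {"paris": rng.normal(size=d), "france": rng.normal(size=d)}
relations = {"capital_of": rng.normal(size=(d, d))}  # one matrix (order-2 tensor) per relation

def tpr(subject_vec, object_vec):
    """Tensor Product Representation of a (subject, object) pair: the outer
    product binds the two filler vectors into a single order-2 tensor."""
    return np.outer(subject_vec, object_vec)

def score(subject, relation, obj):
    """Bilinear score: contract the relation tensor with the TPR of the
    entity pair to evaluate the candidate triple."""
    return np.sum(relations[relation] * tpr(entities[subject], entities[obj]))

print(score("paris", "capital_of", "france"))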